
Averaging over all $i$ and $j$,

$$\bar{I} = \sum_{i} \sum_{j} p(i,j)\,\log \frac{p(i\,|\,j)}{p(i)}\,, \qquad (7.18)$$

but since $p(i,j) = p(i)\,p(j\,|\,i) = p(j)\,p(i\,|\,j)$ (cf. Sect. 9.2.2),

$$\bar{I} = \sum_{i} \sum_{j} p(i,j)\,\log \frac{p(i,j)}{p(i)\,p(j)}\,. \qquad (7.19)$$

If $i = j$ always, then we recover the Shannon index (Eq. 6.5). If the two are statistically independent, $\bar{I} = 0$.
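To make Eq. 7.19 concrete, here is a minimal sketch (not part of the original text) that evaluates $\bar{I}$ for a joint distribution given as a table; the use of NumPy, the function name `mutual_information`, and the example distributions are illustrative assumptions. It reproduces the two limiting cases just described: independence gives $\bar{I} = 0$, and a noiseless channel ($i = j$ always) gives the Shannon index of the source.

```python
import numpy as np

def mutual_information(p_joint):
    """Average mutual information I-bar of Eq. 7.19, in bits, for a
    joint distribution p(i, j) supplied as a 2-D array."""
    p_joint = np.asarray(p_joint, dtype=float)
    p_i = p_joint.sum(axis=1)   # marginal p(i)
    p_j = p_joint.sum(axis=0)   # marginal p(j)
    total = 0.0
    for i in range(p_joint.shape[0]):
        for j in range(p_joint.shape[1]):
            if p_joint[i, j] > 0:   # 0 log 0 is taken as 0
                total += p_joint[i, j] * np.log2(p_joint[i, j] / (p_i[i] * p_j[j]))
    return total

# Statistically independent symbols: I-bar = 0.
print(mutual_information(np.outer([0.5, 0.5], [0.25, 0.75])))   # ~0.0

# i = j always (noiseless channel): I-bar equals the Shannon index of the source.
print(mutual_information(np.diag([0.25, 0.75])))                # ~0.811 bits
```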

From our definition of $p(i,j)$, we can write the posterior probability as

$$p(i\,|\,j) = \frac{p(i,j)}{p(j)} = \frac{p(i)}{p(j)}\,p(j\,|\,i)\,. \qquad (7.20)$$
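As a quick numerical check of Eq. 7.20 (again an illustrative sketch with a made-up joint distribution, not from the text), the posterior obtained via Bayes' relation agrees with the one computed directly from the definition $p(i\,|\,j) = p(i,j)/p(j)$:

```python
import numpy as np

# Hypothetical joint distribution p(i, j); the numbers are purely illustrative.
p_joint = np.array([[0.10, 0.30],
                    [0.20, 0.40]])
p_i = p_joint.sum(axis=1)             # marginal p(i)
p_j = p_joint.sum(axis=0)             # marginal p(j)

p_i_given_j = p_joint / p_j           # p(i|j) = p(i, j) / p(j)
p_j_given_i = p_joint / p_i[:, None]  # p(j|i) = p(i, j) / p(i)

# Eq. 7.20: p(i|j) = [p(i)/p(j)] * p(j|i)
via_bayes = (p_i[:, None] / p_j) * p_j_given_i
print(np.allclose(p_i_given_j, via_bayes))   # True
```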

Shannon’s fundamental theorem for a discrete channel with noise proves that if

the channel capacity is $\mathcal{C}$ and the source transmission rate is $\mathcal{R}$, then if $\mathcal{R} \le \mathcal{C}$, there

exists a coding system such that the source output can be transmitted through the

channel with an arbitrarily small frequency of errors. The capacity of a noisy channel

is defined as

$$\mathcal{C}_{\text{noisy}} = \max\bigl(I(x) - E\bigr)\,, \qquad (7.21)$$

the maximization being over all sources that might be used as input to the channel.
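As an illustration of Eq. 7.21 (a sketch, not from the text), the capacity of a binary symmetric channel can be estimated by maximizing $I(x) - E$, here evaluated as the average mutual information of Eq. 7.19 between input and output, over all binary sources. The 20% error probability anticipates the example in Sect. 7.6; the grid search and the function name are assumptions of mine.

```python
import numpy as np

def bsc_mutual_information(p_one, error_prob):
    """I(x) - E for a binary symmetric channel, i.e. the average mutual
    information (Eq. 7.19) between input x and output y, in bits."""
    px = np.array([1.0 - p_one, p_one])                  # source distribution p(x)
    pyx = np.array([[1 - error_prob, error_prob],        # channel matrix p(y|x)
                    [error_prob, 1 - error_prob]])
    p_joint = px[:, None] * pyx                          # joint p(x, y)
    py = p_joint.sum(axis=0)                             # output distribution p(y)
    return float(np.sum(p_joint * np.log2(p_joint / (px[:, None] * py))))

# Maximise over all binary sources; a grid search suffices for this sketch.
grid = np.linspace(0.001, 0.999, 999)
capacity = max(bsc_mutual_information(q, 0.2) for q in grid)
print(round(capacity, 3))   # ~0.278 bits/symbol, i.e. 1 - H(0.2)
```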

7.6 Error Correction

Suppose a binary transmission channel had a 20% chance of transmitting an incorrect

signal; hence, a message sent as “0110101110” might appear as “1100101110”.

An easy way to render the system immune from such noise would be to repeat

each signal threefold and incorporate a majority detector in the receiver. Hence,

the signal would be sent as “000111111000111000111111111000” and received

as “001011011000110000101111111100” (say), but majority detection would still

enable the signal to be correctly restored. The penalty, of course, is that the channel

capacity is reduced to a third of its previous value.
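A minimal sketch of this triple-repetition scheme (not from the text; the function names are mine), which reproduces the example above:

```python
def encode_repetition(bits, n=3):
    """Repeat each transmitted bit n times (triple repetition for n = 3)."""
    return "".join(b * n for b in bits)

def decode_majority(received, n=3):
    """Majority vote within each block of n received bits."""
    blocks = (received[k:k + n] for k in range(0, len(received), n))
    return "".join("1" if block.count("1") > n // 2 else "0" for block in blocks)

message = "0110101110"
sent = encode_repetition(message)            # "000111111000111000111111111000"
received = "001011011000110000101111111100"  # the corrupted string quoted above
print(decode_majority(received) == message)  # True: every error is corrected
```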

Many physical devices are designed to be immune, to a certain degree, to

random fluctuations in the physical quantities encoding information. In a digital

device, zero voltage applied to a terminal represents the digit “0”, and 1 V (say)

represents the digit “1”. In practice, any voltage up to about 0.5 V will be interpreted
as “0”, and all voltages above 0.5 V will be interpreted as “1” (see Fig. 7.3).
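The threshold rule described here can be sketched as follows (illustrative only; the 0.5 V threshold is the one quoted in the text):

```python
def interpret_voltage(volts, threshold=0.5):
    """Read a terminal voltage as a binary digit using the 0.5 V threshold."""
    return 1 if volts > threshold else 0

print([interpret_voltage(v) for v in (0.05, 0.4, 0.6, 0.95)])   # [0, 0, 1, 1]
```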